ECE 504 –ST: Deep Learning and Spiking Neural Networks
Fall 2017, University of Idaho
Lecture notes and homeworks/projects are on BBlearn.
Instructor : Prof. John Chiasson
Email : johnchiasson AT boisestate DOT edu
Time : MWF 9:30-10:15 AM
Course dates : Aug 22 – Dec 16, 2017
Location : ALB 212
Office Hours : TBA
Holidays : Fall recess from instruction (TBA)
Final Exam time: TBA.
Textbook – For the first half of the course we will follow:
Neural Networks and Deep Learning by Michael Nielsen.
For the second half of the course we will follow:
1. T. Masquelier, R. Guyonneau, S. J. Thorpe (2008) Spike Timing Dependent Plasticity Finds the Start of Repeating Patterns in Continuous Spike Trains. PLoS ONE 3(1): e1377. doi:10.1371/journal.pone.0001377
2. Masquelier, T. and S. J. Thorpe, Unsupervised Learning of Visual Features through Spike Timing Dependent Plasticity, PLoS Computational Biology e31, issue 2, Volume 3, 2007.
3. S. R. Kheradpisheh, M. Ganjtabesh, S. J. Thorpe, and T. Masquelier, STDP-based spiking deep neural networks for object recognition, arXiv:1611.01421v1 [cs.CV] 4 Nov 2016.
4. Nessler B, Pfeiffer M, Buesing L, Maass W (2013) Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity. PLoS Comput Biol 9(4): e1003037. doi:10.1371/journal.pcbi.1003037.
Course content –
Convolutional Deep Neural Networks: The first half of this course will cover deep learning using feedforward and recurrent neural networks. These approaches use supervised learning, which requires minimizing a cost function with the backpropagation algorithm. The details of the backpropagation algorithm will be studied, along with various “tricks” to improve the performance of the network.
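To make the cost-minimization idea concrete, here is a minimal sketch (not the course's official code) of gradient descent with backpropagation for a single sigmoid neuron; the toy dataset, learning rate, and epoch count are illustrative choices of mine:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=1.0):
    """data: list of (x, target) pairs with scalar x and target in {0, 1}."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in data:
            a = sigmoid(w * x + b)          # forward pass
            # backward pass for the quadratic cost C = (a - t)^2 / 2:
            # dC/da = (a - t), da/dz = a * (1 - a), then the chain rule
            delta = (a - t) * a * (1.0 - a)
            w -= lr * delta * x             # gradient-descent updates
            b -= lr * delta
    return w, b

# Toy task: output ~1 for positive inputs, ~0 for negative inputs.
data = [(-2.0, 0.0), (-1.0, 0.0), (1.0, 1.0), (2.0, 1.0)]
w, b = train(data)
print(sigmoid(w * 1.0 + b) > 0.5, sigmoid(-w * 1.0 + b) < 0.5)
```

In a real network the same chain-rule computation is carried out layer by layer, which is the subject of the first half of the course.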
Convolutional Deep Spiking Neural Networks: In the second half of the course we study spiking neural networks. These networks have an architecture similar to the convolutional networks above, but information is encoded in brain-inspired spikes. They hold the promise of implementation in analog electronics with very low power requirements. Rather than using the backpropagation algorithm to learn (update the weights), these networks use spike-timing-dependent plasticity (STDP) to update the weights.
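As a rough sketch of the STDP rule just mentioned (a generic pairwise additive form, not the exact rule of the assigned papers; the parameter values below are illustrative), a weight is potentiated when the presynaptic spike precedes the postsynaptic spike and depressed otherwise, with an exponential dependence on the spike-time difference:

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.05, a_minus=0.055,
                tau=20.0, w_min=0.0, w_max=1.0):
    """Return the weight after one pre/post spike pairing (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:      # pre fires before post -> potentiation (LTP)
        w += a_plus * math.exp(-dt / tau)
    elif dt < 0:    # post fires before pre -> depression (LTD)
        w -= a_minus * math.exp(dt / tau)
    return min(w_max, max(w_min, w))  # clip weight to its allowed range

print(stdp_update(0.5, t_pre=10.0, t_post=15.0))  # > 0.5 (potentiated)
print(stdp_update(0.5, t_pre=15.0, t_post=10.0))  # < 0.5 (depressed)
```

The readings listed above use variants of this idea to learn features from spike trains without any labeled data.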
PREREQ: Mathematical maturity and basic programming ability. Students should be comfortable with the basics of probability theory and linear algebra.
Note:
This course will make ample use of the Python programming language and MATLAB. Students are advised to brush up on Python from sources such as https://www.learnpython.org/.
Workload
(Grading)
Homeworks 20%
Final Project 80%
Example Work
ConvNets for Modulation Recognition
Make-Up Policies
Only students presenting medical or official university excuses to the instructor will be allowed to make up an exam or other missed assignment. Whenever possible, arrangements should be made with the instructor prior to the regularly scheduled exam or assignment due date. Making these arrangements is entirely the responsibility of the student. Make-up exams or other assignments may differ from those given at the regularly scheduled time, and whether an absence is deemed excusable is at the discretion of the instructor.
Academic Honesty
Academic honesty is governed by Article II of the University of Idaho’s Student Code of Conduct, http://www.webs.uidaho.edu/fsh/2300.html . Cheating on classroom or outside assignments, including examinations, is a violation of this code. Incidents of academic dishonesty will be kept on file by the instructor and may be reported to the Dean of Students. Such instances of academic dishonesty may warrant expulsion from the course and a failing grade. All students should be aware that even one incident of academic dishonesty may also merit expulsion from the University.
Policies
· Homework and exam scores become final one week after they are returned to the class.
· Late submissions of assignments and project reports are discouraged; however, if you cannot finish on time, you may submit late (before the solutions are made available) with a compounding 25%-per-day deduction applied to the grade. For example, a 100-point assignment is graded out of 75 points if 1 day late, 56 if 2 days, 42 if 3 days, 32 if 4 days, etc.
· Submissions will not be accepted once the solutions have been distributed by any means.
· Assignments must be turned in during the class session. I will not accept any assignment dropped in my office mailbox without my prior permission. You may consult with others on assignments, provided you submit only your own attempt at the work. Identical assignments will receive a grade of zero and be treated as a case of academic dishonesty. An assignment is considered one day late if it is not turned in before 12:30 PM on the day it is due.
· Neither the final exam nor the final project will be returned at the end of the semester.
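The compounding late-penalty arithmetic in the policy above can be sketched in a few lines (rounding to whole points is my assumption, not a stated policy):

```python
# Each day late multiplies the maximum attainable score by 0.75.
def max_score(points, days_late):
    return round(points * 0.75 ** days_late)

print([max_score(100, d) for d in range(5)])  # -> [100, 75, 56, 42, 32]
```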